#sparse language model · 11/05/2025
Huawei Unveils Pangu Ultra MoE: A 718B-Parameter Sparse LLM Optimized for Ascend NPUs
Huawei has introduced Pangu Ultra MoE, a 718-billion-parameter sparse Mixture-of-Experts language model designed for Ascend NPUs. The model combines simulation-driven architecture design with system-level optimizations to achieve high efficiency and performance.